
    Spatio-temporal optimization of tree removal to efficiently minimize crown fire potential

    High-intensity wildfires have resulted in large financial, social, and environmental costs in the western U.S. This trend is not expected to decline soon, as there are millions of overstocked hectares at medium to high risk of catastrophic wildfires. Thinning is widely used to restore different types of overstocked forest stands. Typically, thinning prescriptions are derived from average stand attributes and applied to landscapes containing a large number of stands. Stand-level thinning prescriptions thus have limitations when applied to reduce the risk of high-intensity wildfires: they use indicators of crown fire potential (e.g., canopy base height and canopy bulk density) that ignore the variability of fuels within stands, the locations of individual cut- and leave-trees after treatment, and the temporal effects of these prescriptions on crown fire potential over time. To address these limitations, a computerized approach was designed to optimize individual tree removal and produce site-specific thinning prescriptions. Based on stem maps and tree attributes derived from light detection and ranging (LiDAR) data, the approach predicts individual tree growth over time, quantifies tree-level fuel connectivity, and estimates skidding costs for individual trees. The approach then selects the spatial combination of cut-trees that most efficiently reduces crown fire potential over time while ensuring the cost efficiency of the thinning treatment.
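    The core selection step described above could be sketched as a greedy search that repeatedly removes the tree whose removal most reduces fuel connectivity per unit skidding cost, within a budget. This is an illustrative simplification, not the paper's actual optimization: the `Tree` attributes, the pairwise crown-overlap connectivity metric, and the budget constraint are all assumptions.

```python
# Hypothetical greedy cut-tree selection: all attributes and the
# connectivity metric are illustrative, not the paper's actual model.
from dataclasses import dataclass
from math import hypot

@dataclass
class Tree:
    x: float
    y: float
    crown_radius: float
    skid_cost: float  # assumed cost to fell and skid this tree

def connectivity(trees):
    """Sum of pairwise crown overlaps as a crude fuel-connectivity proxy."""
    total = 0.0
    for i, a in enumerate(trees):
        for b in trees[i + 1:]:
            gap = hypot(a.x - b.x, a.y - b.y) - (a.crown_radius + b.crown_radius)
            if gap < 0:
                total += -gap  # crowns touch or overlap
    return total

def greedy_thin(trees, budget):
    """Remove cut-trees one at a time, maximizing connectivity
    reduction per unit skidding cost, until the budget is exhausted."""
    remaining, removed, spent = list(trees), [], 0.0
    while True:
        base = connectivity(remaining)
        best, best_score = None, 0.0
        for t in remaining:
            if spent + t.skid_cost > budget:
                continue
            gain = base - connectivity([u for u in remaining if u is not t])
            score = gain / t.skid_cost
            if score > best_score:
                best, best_score = t, score
        if best is None:
            return removed
        remaining.remove(best)
        removed.append(best)
        spent += best.skid_cost
```

    A full spatio-temporal formulation would additionally re-evaluate connectivity under simulated tree growth at future time steps, which this sketch omits.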

    Applying ant colony optimization (ACO) metaheuristic to solve forest transportation planning problems with side constraints


    Remote Sensing of Forests using Discrete Return Airborne LiDAR

    Airborne discrete return light detection and ranging (LiDAR) point clouds covering forested areas can be processed to segment individual trees and retrieve their morphological attributes. Segmenting individual trees in natural deciduous forests, however, has remained a challenge because of the complex and multi-layered canopy. In this chapter, we present (i) a robust segmentation method that avoids a priori assumptions about the canopy structure, (ii) a vertical canopy stratification procedure that improves segmentation of understory trees, (iii) an occlusion model for estimating the point density of each canopy stratum, and (iv) a distributed computing approach for efficient processing at the forest level. When applied to the University of Kentucky Robinson Forest, the segmentation method detected about 90% of overstory and 47% of understory trees with over-segmentation rates of 14% and 2%, respectively. Stratifying the canopy improved the detection rate of understory trees to 68% at the cost of increasing their over-segmentation rate to 16%. According to our occlusion model, a point density of ~170 pt/m² is needed to segment understory trees as accurately as overstory trees. Lastly, using the distributed approach, we segmented about two million trees in the 7440-ha forest in 2.5 hours using 192 processors, which is 167 times faster than using a single processor.
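    The reported speedup implies near-linear scaling of the distributed approach; a quick check of the implied parallel efficiency, using only the figures quoted above:

```python
# Parallel efficiency implied by the reported run: a 167x speedup on
# 192 processors (figures taken from the abstract above).
def parallel_efficiency(speedup, processors):
    """Fraction of ideal linear speedup actually achieved."""
    return speedup / processors

efficiency = parallel_efficiency(167, 192)  # ~0.87, i.e. ~87% of linear scaling
```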

    Forest Understory Trees Can Be Segmented Accurately Within Sufficiently Dense Airborne Laser Scanning Point Clouds

    Airborne laser scanning (LiDAR) point clouds over large forested areas can be processed to segment individual trees and subsequently extract tree-level information. Existing segmentation procedures typically detect more than 90% of overstory trees, yet they barely detect 60% of understory trees because of the occlusion effect of higher canopy layers. Although understory trees provide limited financial value, they are an essential component of ecosystem functioning by offering habitat for numerous wildlife species and influencing stand development. Here we model the occlusion effect in terms of point density. We estimate the fractions of points representing different canopy layers (one overstory and multiple understory) and also pinpoint the required density for reasonable tree segmentation (where accuracy plateaus). We show that at a density of ~170 pt/m² understory trees can likely be segmented as accurately as overstory trees. Given the advancements of LiDAR sensor technology, point clouds will affordably reach this required density. Using modern computational approaches for big data, the denser point clouds can efficiently be processed to ultimately allow accurate remote quantification of forest resources. The methodology can also be adopted for other similar remote sensing or advanced imaging applications such as geological subsurface modelling or biomedical tissue analysis.
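    The occlusion effect can be captured with a simple geometric-decay model, in which each canopy layer intercepts a fixed fraction of the remaining laser pulses. This form and its parameters are purely illustrative, not the authors' fitted model:

```python
# Illustrative occlusion model (assumed form, not the authors' fitted model):
# each canopy layer intercepts a fixed fraction of the remaining pulses, so
# the density of points reaching deeper strata decays geometrically.

def stratum_density(total_density, interception, k):
    """Point density reaching canopy stratum k (0 = overstory)."""
    return total_density * (1 - interception) ** k

def required_total_density(target, interception, k):
    """Acquisition density needed so stratum k still receives `target` pt/m2."""
    return target / (1 - interception) ** k
```

    Under such a model, the required acquisition density grows geometrically with the depth of the understory stratum one needs to resolve, which is why understory segmentation demands far denser point clouds than overstory segmentation.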

    Quantifying the Effects of Biomass Market Conditions and Policy Incentives on Economically Feasible Sites to Establish Dedicated Energy Crops

    This study used a spatially-explicit model to identify the amount and spatial distribution of economically feasible sites for establishing dedicated energy crops under various market and policy scenarios. A sensitivity analysis was performed for a biomass market with different discount rates and biomass prices as well as policy scenarios including property tax exemption, carbon offset payments, and the inclusion of farmland for biomass production. The model was applied to a four-county study area in Kentucky representing conditions commonly found in the Ohio River Valley. Results showed that both biomass price and discount rate can strongly influence the amount of economically feasible sites. Raising the biomass price by $5·t−1 and lowering the discount rate by 1% from the baseline scenario ($40·t−1 and 5%) resulted in an over fourteen-fold increase. Property tax exemption resulted in a fourfold increase, a carbon payment of only $1·t−1 caused a twelve-fold increase, and extending the land base from marginal land to farmland only slightly increased the economically feasible sites. These results provide an objective evaluation of market and policy scenarios in terms of their potential to increase land availability for establishing dedicated energy crops and to promote the bioenergy industry.
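    The feasibility test behind such a model reduces, per site, to a net-present-value calculation over the rotation. The sketch below shows the mechanism only; the cash-flow structure and all parameter values are hypothetical stand-ins, not the study's actual model:

```python
# Minimal per-hectare NPV sketch for one candidate site. Cash-flow structure
# and parameters are hypothetical, not the study's actual model.

def npv(yields_t_per_ha, price_per_t, annual_cost, establishment_cost,
        discount_rate, carbon_payment_per_t=0.0):
    """Net present value per hectare over the rotation.

    yields_t_per_ha: harvested biomass (t/ha) for each year of the rotation.
    """
    value = -establishment_cost  # paid up front (year 0)
    for year, yld in enumerate(yields_t_per_ha, start=1):
        revenue = yld * (price_per_t + carbon_payment_per_t)
        value += (revenue - annual_cost) / (1 + discount_rate) ** year
    return value
```

    A site is deemed economically feasible when its NPV is positive; the sensitivity directions reported above follow directly, since raising the price (or adding a carbon payment) increases every year's revenue while lowering the discount rate increases the present value of all future cash flows.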

    Two short mass-loss events that unveil the binary heart of Minkowski's Butterfly Nebula

    Studying the appearance and properties of bipolar winds is critical to understanding stellar evolution from the AGB to the planetary nebula (PN) phase. Many uncertainties exist regarding the presence and role of binary stellar systems, mainly due to the deficit of conclusive observational evidence. We investigate the extended equatorial distribution around the early bipolar planetary nebula M 2-9 ("Minkowski's Butterfly Nebula") to gather new information on the mechanism of the axial ejections. Interferometric millimeter observations of molecular emission provide the most comprehensive view of the equatorial mass distribution and kinematics in early PNe. Here we present subarcsecond angular-resolution observations of the 12CO J=2-1 line and continuum emission with the Plateau de Bure interferometer. The data reveal two ring-shaped and eccentric structures at the equatorial base of the two coaxial optical lobes. The two rings were formed during short mass-loss episodes (~40 yr), separated by ~500 yr. Their positional and dynamical imprints provide evidence of a binary stellar system at the center, which yields critical information on its orbital characteristics, including a mass estimate for the secondary of ≲ 0.2 M⊙. The presence of a stellar system with a modest-mass companion at the center of such an elongated bipolar PN strongly supports binary-based models, because these are more easily able to explain the frequent axisymmetric ejections in PNe.

    Optimising Antibiotic Treatments with Multi-objective Population-based Algorithms

    Antibiotic resistance is one of the major challenges that we are facing today. The frequent overuse of antibiotics is one of the main reasons for the development of resistance. A mathematical model of bacterial population dynamics is used, where drug administration and absorption mechanics are implemented to evaluate the fitness of automatically designed treatments. To maximise the probability of curing the host while minimising the total drug used, we have explored treatments with different daily dosages and lengths. Two multi-objective population-based methods, a well-known evolutionary algorithm and a particle swarm optimisation algorithm, are tuned and contrasted when solving the posed treatment design problem. The best solutions found by our approach suggest treatments ranging from five to seven days, with a high initial dose followed by lower doses, and use lower amounts of the drug than the standard common practice of fixed daily dosages over ten days.
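    At the heart of any such multi-objective method is the Pareto-dominance comparison between candidate treatments, here over the two objectives named above: cure probability (maximised) and total drug used (minimised). The tuple encoding of a treatment is an illustrative assumption:

```python
# Pareto-dominance comparison over the two objectives named above.
# A candidate is encoded as (cure_prob, total_drug) — an assumed encoding.

def dominates(a, b):
    """True if a is at least as good as b on both objectives and
    strictly better on at least one (maximise cure_prob, minimise drug)."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

def pareto_front(solutions):
    """Keep every candidate not dominated by any other candidate."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

    Both the evolutionary algorithm and the particle swarm optimiser maintain a population and rank it with exactly this kind of dominance check, so the output is a front of trade-off treatments rather than a single answer.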